Process Improvement - The Way Forward
Abstract
The paper briefly reviews the difficulties encountered in achieving further major improvement in some aspects of the software evolution process. This may, it is suggested, be due, in part, to the fact that the global process is a complex multi-loop, multi-level feedback system. After summarising the results of a 1970s study and the laws of software evolution that arose from it, the paper introduces the FEAST hypothesis. This asserts that the global software process for E-type systems is such a multi-loop feedback system and displays the properties associated with such systems. Early results from the FEAST study, which culminated in the FEAST/1 project started in October 1996, are outlined. The support these results provide for the original five Laws of Software Evolution and for three extensions is also indicated. The FEAST/1 project is to explore the phenomenon in depth through modelling of a number of industrial projects and will seek to demonstrate its impact. The early work of the 1970s, which concentrated primarily on OS/360, as confirmed by the FEAST results obtained to date, suggests that mastery of the process as a feedback system is key to further process improvement.

1 Software Process Improvement

1.1 Overview

In their early years digital computers were primarily regarded as tools for the automation of numerical computation. In the subsequent, now over forty, years of evolution of digital computing, applications and their operational domains have been extended in their variety, in the detail in which they are addressed and in the extent to which the computer system is integrated with and into the application, its operational environment and the activities of the humans in that environment. This growth is reflected in the range of functionality of current operational systems. The size, functional complexity and structural complexity of the software developed and successfully used have increased by several orders of magnitude.
This is reflected in an even greater increase in the number, variety and operational complexity of the features incorporated in the software. The operational domain, the application, activities in that domain and the characteristics of the humans and mechanisms involved are reflected in the software in ever greater detail. The intimate coupling between the software, the operational environment and the humans in that environment is reflected in the growing complexity of the interface.

The search for improvement in the software evolution process can be traced back to the very beginning of digital computers and of programming. In the preface to their book The Preparation of Programs for an Electronic Digital Computer [wil51], Wilkes, Wheeler and Gill wrote, in connection with their invention of the concept of subroutines, "The methods of preparing programs for the EDSAC described in this book were developed with a view to reducing to a minimum the amount of labour required and hence of making it feasible to use the machine for problems that require only a few hours of computing time as well as for those which require many hours ..."

The search for improvement has been in the forefront of programming research and development ever since, though the term process improvement has only come into common usage in the last decade. Innovations relating to attributes such as programming productivity, product quality, process predictability and responsiveness indicate the success of that search. They include high level languages, structured programming, abstract data types, new programming paradigms, formal methods, metrics, CASE, support environments, process modelling, model based process improvement and so on. The list is endless.

9/4/97, 5:11 pm, mml565[papers]

Footnote 1: At the highest level of detail, the processes of software development and maintenance are equivalent, sharing a common underlying paradigm [leh84]. Each, in its own way, achieves system evolution.
Hence, unless otherwise stated, the term software evolution process, or process in short, as used throughout this paper includes both activities.

Footnote 2: This author's italics.

There have, unfortunately, been few studies that provide convincing quantitative data on the actual benefit that such innovations yield. The awesome increase since the early days of computing in the functional complexity and power of software systems being developed, used and maintained testifies to the fact that major progress has been made. So does the fact that development times and software costs have, in many instances, decreased significantly. But maintenance activity remains as intensive as ever; time to delivery and the ultimate cost of software development projects are still not reliably predictable; the functionality and quality of early releases of a system are below specification, often below par. There continue to be too many cases of unsatisfactory or even incorrect system behaviour after installation, too many disastrous failures, too many projects that never deliver.

This is not to suggest that individual process innovations did not represent a technological advance. They clearly yielded local improvement in one way or another, that is, at the point of application. Use of high level languages, for example, led to increases in the average rate of code generation of order five or so. Further significant benefit resulted from the increased understandability that the use of higher level languages yields and the consequent improvement in the quality, structure, maintainability and evolvability of code. These aspects led to significant reductions in development time, reduced the need for fault fixing maintenance and so contributed to growth in productivity over system lifetime. But their adoption has not solved the overall software problem, as indicated in the previous paragraph. Formal methods have prospered in academic research. They provide the rationale and a methodological base for many CASE tools.
But their contribution to industrial effectiveness is limited. CASE tools yield local benefit to their users through the methods they make available and by the rigour they impose. Yet they have not made a major impact on industrial software development effectiveness. And so it goes on.

Introduction of such innovations has generally been based on the argument that their general adoption by industry will, once and for all, end the software crisis. Failure to live up to this expectation can be rationalised and individually explained. That path is not pursued in the present paper. The fact remains that general experience, over innovations, over organisations, over different process structures and application areas, is that the global benefit derived from the introduction of an individual improvement is much more limited than might be expected from its impact at the local level. This suggests that it will be more profitable to seek a common explanation. It would seem likely that some intrinsic property of the process is constraining improvement of the global process followed to develop or adapt an operational computer system to satisfactorily address market needs.

The most obvious explanation is that the expectations of innovators are often unreasonable. But that hardly explains the slow progress in achieving major improvement in the overall effectiveness of the total process from conception to system operation and its subsequent evolution. Clearly also, the introduction of individual languages, methods, tools, paradigms or programming steps or activities impacts only a small portion of the total effort required to take a system from conception to deployment in the field. Consider, for example, doubling the effectiveness, that is halving the duration, of an individual activity through the introduction of a new or improved language, method or tool.
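The arithmetic of this scenario, worked through in the next paragraph, can be sketched in a few lines. Halving a step that accounts for a fraction f of the total process shrinks the whole by f/2. The function and variable names below are mine; the step fractions are the two scenarios discussed in the text:

```python
# Sketch of the local-vs-global improvement arithmetic.
def global_reduction(local_fraction, local_speedup):
    """Fractional reduction in total process duration when one step,
    absorbing local_fraction of the total, is sped up by local_speedup."""
    new_total = (1.0 - local_fraction) + local_fraction / local_speedup
    return 1.0 - new_total

# Primitive process: one step absorbs 50% of the total and is halved.
primitive = global_reduction(0.50, 2.0)   # 0.25, i.e. a 25% global saving
# Modern process: no single step exceeds about 5% of the total.
modern = global_reduction(0.05, 2.0)      # 0.025, i.e. about 2.5%
```

The same 2x local gain thus yields a whole-process saving of 25% in the primitive case but only about 2.5% in the modern one, small enough to be lost in process-to-process variation.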
In a primitive process that step might originally have absorbed 50% of the total effort expended or time required to take the product from conception to installation in the field. The total process effort or duration is therefore reduced by 25%, a visible and significant improvement. In today's more sophisticated industrial processes few, if any, steps represent more than, say, 5% of the total effort from start to finish. Thus the externally visible impact of the same improvement would be of order well under 3%, a trivial improvement relative to the 50% local gain. Indeed it would probably be lost in the noise, the variations that occur from process to process due to differences in environmental, including project, conditions.

If improvement is measured in terms of quality, a similar argument applies. The impact of the quality achieved in individual steps on global quality is likely to vary considerably. The contributions of individual steps to achieving a quality product and to the effective productivity of the process are likely to vary widely between steps but will also be much smaller at the global level than locally. Other explanations are also possible but will not be discussed here [leh95]. Instead, attention is drawn to a constraint which, with hindsight, is almost self evident. The phenomenon which underlies it was first identified over 25 years ago, but only recently has the penny dropped.

1.2 Recent Developments in the Search for Process Improvement

The very first recognition of and advance in process improvement was almost certainly in the 1951 book by Wilkes et al as quoted above, referring to their "invention" of the concept and management of subroutines in the context of the Cambridge University EDSAC development [wil51]. Until recently the approach to further improvement was indirect.
Research and development interest concentrated, in general, on specific artifacts of or activities in the software development process. Process maintenance received scant attention. The early focus was on procedural programming languages, considering both level and style. With regard to the former, languages have advanced from binary codes and machine languages, through assembly languages, to a large variety of (so called) high and very high level languages. Alternative paradigms have also been explored. In general, however, procedural programming has not been displaced from primary industrial usage, even though other approaches, functional programming via transformation or logic programming, for example, have been widely explored and applied.

Over the years, concepts relating to the syntax, semantics and use of programming languages have emerged. Examples include structured programming [dij68, knu74, you79], successive refinement [wir71] or, more recently, object orientation [boo86]. Together these led the way to more detailed consideration of the overall technical programming process. It was, for example, successively recognised over many years that program development must be preceded by a design activity, that this must be preceded by development of a specification and that this, in turn, must be based on a requirements statement which itself had been derived from a requirements analysis. More recently the need for an initial application domain analysis has been recognised, though this may well be just another term for what had previously been termed systems analysis. Activities to implement these needs were then developed, initially on an ad hoc basis, and added, in general, to a waterfall-type process [roy70, boe76]. And so with other aspects of the software process. The need to understand and improve it was recognised from the beginning of the digital computer age [wil51, ben56, leh69, roy70, boe76].
The discussion at the NATO Software Engineering conferences [nau69, bux70] should also not be overlooked. Wider awareness was triggered by a series of International Process Workshops, the first of which was held in 1984 [spw84]. A keynote lecture by Osterweil [ost86,97] and a response thereto [leh86,97] triggered strong interest in modelling the process, particularly in academia. The primary focus of the modelling effort was on understanding the process as such. The search for further significant improvement was, however, never far away. More conscious and directed work on process improvement was a direct outcome of research and development at the Software Engineering Institute (SEI) at Carnegie Mellon University. This culminated in their CMM models, related improvement technology and the emergence of an international SPIN (Software Process Improvement Network) movement.

2 The Laws of Software Evolution

A 1968/9 study of the IBM programming process [leh69] provided, inter alia, quantitative data on various attributes relating to successive releases of IBM's operating system OS/360 and on the effort expended in going from release to release. As data on further releases became available, a series of models reflecting the growth trends of the system and attributes of the process whereby it was being evolved were developed [leh85]. Their analysis led, in turn, to a descriptive phenomenology. This suggested that the behaviour reflected in the observations was primarily due to the influence of human and organisational factors. That is, the characteristics of system evolution were largely determined by factors other than the process and technology being used to achieve that evolution. This observation and its implications were summarised and encapsulated in a series of behavioural statements that were exogenous to the technology being used in the evolution process. From the point of view of the software engineer they were therefore to be viewed as laws.
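The kind of release-to-release growth model referred to above can be illustrated with a minimal sketch. The inverse-square increment used here, and all parameter values, are assumptions chosen purely for illustration of a feedback-constrained growth pattern; they are not data or a model reported in this paper:

```python
# Illustrative only: system size over successive releases, with the
# per-release increment shrinking as the system grows (one simple
# feedback-constrained pattern). Parameter values are invented.
def growth_trajectory(s0, E, releases):
    """Return predicted sizes for release 0..releases, where each
    release adds E / size**2 to the previous size."""
    sizes = [float(s0)]
    for _ in range(releases):
        s = sizes[-1]
        sizes.append(s + E / (s * s))
    return sizes

sizes = growth_trajectory(s0=100.0, E=2.0e5, releases=8)
increments = [b - a for a, b in zip(sizes, sizes[1:])]
# The system keeps growing, but each release adds less than the last,
# the qualitative behaviour the OS/360 growth-trend data exhibited.
```

The point of the sketch is qualitative: growth continues but decelerates, consistent with the suggestion that organisational and environmental feedback, rather than the technology in use, dominates the observed evolution.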
The first three of these laws were formulated in the mid seventies [leh74] and discussed in greater detail in 1978 [leh78]. Two further laws were introduced in 1980 [leh80a]. A sixth was introduced in a subsequent footnote [leh91]. The remaining two, while publicly discussed, have only recently been published [leh96c]. As restated explicitly below, the laws relate to E-type systems [leh80b], that is, broadly speaking, to software systems that solve a problem or implement a computer application in the real world.

Footnote 3: References identified by a * in the listing are reprinted in [leh85].